New Jersey teen sues classmate for allegedly creating, sharing fake AI nudes
A New Jersey teen is suing a classmate for allegedly creating and sharing AI-generated pornographic images of her and other classmates.
A male classmate used an "AI application or website" to alter photos of the 15-year-old, identified only as Jane Doe because she is a minor, and of other female classmates at Westfield High School, according to a federal lawsuit filed in the U.S. District Court for the District of New Jersey. The photos were initially shared on Instagram.
In all of the photos, Jane Doe and the other girls were clothed, but the AI application digitally removed the clothing and created new images that made the girls appear nude. Their faces remained easily identifiable, the lawsuit said.
"These nude photos of Jane Doe and other minor girls are virtually indistinguishable from real, unaltered photos," the lawsuit said.
The classmate who allegedly made the images then shared the edited photos with fellow classmates and "possibly others," the lawsuit said, using the internet and Snapchat to distribute them during the summer of 2023. Snapchat's parent company, Snap, told CBS News that its policies prohibit the sharing of such images and that its app cannot be used to create them.
"We have zero tolerance for the sexual exploitation of any member of our community," Snap said in a statement.
Jane Doe and her family learned about the images in October 2023, when her parents, who were also not identified in the lawsuit, were contacted by her Union County high school. The school's assistant principal said that officials were aware of the images and had confirmed that Jane Doe was a "victim," the suit said. According to the assistant principal, a student had called into the school office to alert officials about seeing nude photos of Jane Doe.
The defendant's father also reached out to Jane Doe's parents, according to the lawsuit. Jane Doe's parents "immediately cooperated with an investigation launched by the Westfield Police Department," but charges were not pursued because the information gathered by school officials could not be used in the investigation.
In addition, the lawsuit said, the "defendant and other potential witnesses failed to cooperate with, speak to, or provide access to their electronic devices to law enforcement." As a result, law enforcement was not able to determine how widely the photos had been shared, or to ensure that the photos were deleted and not shared further.
"Victims of child and nonconsensual pornography in which their actual faces appear, including Jane Doe, are not only harmed and violated by the creation of such images, but they are also haunted for the rest of their lives by knowing that they were and likely will continue to be exploited for the sexual gratification of others and that, absent court intervention, there is an everlasting threat that such images will be circulated in the future," the lawsuit said.
Jane Doe "suffered and will continue to suffer substantial" reputational and psychological harm because of the photos, the lawsuit said, and she has dealt with "substantial emotional distress, mental anguish, anxiety, embarrassment, shame, humiliation" and "injuries and harms for which there is no adequate remedy at law" since learning about the photos.
The lawsuit requested that Jane Doe receive damages of $150,000 for each disclosure of a nude image, compensatory and punitive damages to be determined at trial, and a temporary restraining order or an injunction preventing the defendant from sharing the images or disclosing the identity of Jane Doe and her family. The defendant would be required to transfer all of the images to Jane Doe, and then permanently delete and destroy any copies of the images.
Shane Vogt, a lawyer representing Jane Doe, told CBS News that he hopes the case "is successful and will demonstrate that there is something victims can do to protect themselves from the AI pornography epidemic."
Several states have passed laws to combat the spread of AI-generated pornographic images and criminalize them as use of the technology has soared. In New Jersey, a bill is in the works to ban deepfake pornography and impose a fine, jail time or both on those who share the altered images. President Joe Biden signed an executive order in October that called for banning the use of generative AI to produce child sexual abuse material or non-consensual pornography.
Kerry Breen is a reporter and news editor at CBSNews.com. A graduate of New York University's Arthur L. Carter School of Journalism, she previously worked at NBC News' TODAY Digital. She covers current events, breaking news and issues including substance use.